
Reviews: Controlling Neural Level Sets

Neural Information Processing Systems

This paper addresses the important task of controlling the level sets that comprise the decision boundaries of a neural network. I think the proposed method is quite reasonable, well described, and convincingly demonstrated to be quite useful across a number of tasks.

Re level set sampling:
- Doesn't the ReLU activation imply that D_x F(p; theta) is often 0 at many points p? How do you get around this issue when attempting to optimize p toward S(theta)? It seems this optimization might often get stuck in regions where D_x F(p; theta) = 0 yet p lies far away from S(theta).
- Furthermore, the particular choice used by the authors should be better motivated in the text, as it is not clear to me.
- It seems one can sample from the level 0 set by instead just optimizing min_x F(x; theta) via gradient descent in x.
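The reviewer's alternative can be sketched concretely. One common variant is to minimize the squared value F(x)^2 rather than F itself, so that points on either side of the level set converge to it. The field F below is a hypothetical toy example (the unit circle), not the paper's network, and all names are illustrative:

```python
# Reach the 0 level set of F by gradient descent in x, minimizing F(x)^2
# so that the level set itself is the set of minimizers.

def F(x):
    # toy scalar field: the zero level set is the unit circle
    return x[0] ** 2 + x[1] ** 2 - 1.0

def grad_F(x):
    # spatial gradient D_x F of the toy field
    return [2.0 * x[0], 2.0 * x[1]]

def descend_to_level_set(x, lr=0.05, steps=500):
    # gradient of F(x)^2 is 2 * F(x) * D_x F(x)
    for _ in range(steps):
        f = F(x)
        g = grad_F(x)
        x = [x[0] - lr * 2.0 * f * g[0],
             x[1] - lr * 2.0 * f * g[1]]
    return x

p = descend_to_level_set([2.0, 0.5])  # converges onto the unit circle
```

As the reviewer notes, for a ReLU network the analogue of grad_F can vanish on whole regions, in which case this descent stalls; the toy field above is smooth, so the issue does not arise here.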


Reviews: Controlling Neural Level Sets

Neural Information Processing Systems

The paper proposes a method to control the level set of the decision surface of neural networks. The reviewers found the approach to be novel and convincingly demonstrated to work on tasks such as CIFAR-10. Furthermore, the method is shown to increase robustness to adversarial perturbations. The work successfully tackles an important topic and as such would be of interest to the NeurIPS community.


Controlling Neural Level Sets

Atzmon, Matan, Haim, Niv, Yariv, Lior, Israelov, Ofer, Maron, Haggai, Lipman, Yaron

Neural Information Processing Systems

The level sets of neural networks represent fundamental properties such as decision boundaries of classifiers and are used to model non-linear manifold data such as curves and surfaces. Thus, methods for controlling the neural level sets could find many applications in machine learning. In this paper we present a simple and scalable approach to directly control level sets of a deep neural network. Our method consists of two parts: (i) sampling of the neural level sets, and (ii) relating the samples' positions to the network parameters. The latter is achieved by a sample network that is constructed by adding a single fixed linear layer to the original network. In turn, the sample network can be used to incorporate the level set samples into a loss function of interest.
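The second step — relating sample positions to network parameters — can be sketched to first order: differentiating F(p(theta); theta) = 0 along the level set gives dp/dtheta = -(dF/dtheta) * D_x F / ||D_x F||^2, i.e., the sample moves along the normal direction. This is a first-order reconstruction under a hypothetical toy field, not the paper's actual sample-network code; all names below are illustrative:

```python
# First-order motion of a level-set sample p as a scalar parameter theta
# changes, from implicit differentiation of F(p(theta); theta) = 0:
#   dp/dtheta = -(dF/dtheta) * D_x F / ||D_x F||^2

def F(x, theta):
    # hypothetical field: the 0 level set is the circle of radius theta
    return x[0] ** 2 + x[1] ** 2 - theta ** 2

def grad_x_F(x, theta):
    # spatial gradient D_x F
    return [2.0 * x[0], 2.0 * x[1]]

def dF_dtheta(x, theta):
    # derivative of F with respect to the parameter
    return -2.0 * theta

def sample_velocity(p, theta):
    # how the sample p moves, to first order, as theta changes
    n = grad_x_F(p, theta)
    nn = n[0] ** 2 + n[1] ** 2
    s = -dF_dtheta(p, theta) / nn
    return [s * n[0], s * n[1]]

# sample on the unit circle (theta = 1); growing theta moves it outward
v = sample_velocity([0.6, 0.8], 1.0)  # v is approximately [0.6, 0.8]
```

In the paper this relation is realized by the sample network's added fixed linear layer, so that automatic differentiation propagates a loss on the sample positions back to the network parameters.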